Neural networks optimally trained with noisy data
Authors
Abstract
Similar Papers
Optimally adapted multi-state neural networks trained with noise
The principle of adaptation in a noisy retrieval environment is extended here to a diluted attractor neural network of Q-state neurons trained with noisy data. The network is adapted to an appropriate noisy training overlap and training activity, which are determined self-consistently by the optimized retrieval attractor overlap and activity. The optimized storage capacity and the corresponding...
Instantaneously Trained Neural Networks
This paper presents a review of instantaneously trained neural networks (ITNNs). These networks trade learning time for size and, in the basic model, a new hidden node is created for each training sample. Various versions of the cornerclassification family of ITNNs, which have found applications in artificial intelligence (AI), are described. Implementation issues are also considered.
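The defining trait of an ITNN described above, creating one hidden node per training sample, amounts to instance-based learning. The following minimal sketch illustrates that idea with a hidden "corner" unit per stored sample that fires within a Hamming radius of its pattern; the radius and the majority-vote output rule are illustrative assumptions, not the exact corner-classification (CC4) weight scheme.

```python
import numpy as np

class ITNNSketch:
    """Sketch of instantaneous training: one hidden unit per sample.

    Each binary training vector becomes a hidden 'corner' unit that fires
    when the input lies within Hamming radius r of it; the output is the
    sign of the summed labels of firing units. Both choices are
    illustrative, not the published corner-classification rule.
    """

    def __init__(self, radius=1):
        self.radius = radius

    def fit(self, X, y):
        # "Training" is just storing the samples -- no iterative learning.
        self.X = np.asarray(X)
        self.y = np.asarray(y)
        return self

    def predict(self, x):
        x = np.asarray(x)
        dist = np.sum(self.X != x, axis=1)  # Hamming distance to each corner
        active = dist <= self.radius        # hidden units that fire
        if not active.any():                # nothing fires: fall back to nearest
            active = dist == dist.min()
        # Output unit: sign of the summed +1/-1 labels of active units
        return 1 if self.y[active].sum() >= 0 else -1

net = ITNNSketch(radius=1).fit([[1, 0, 1, 0], [0, 1, 0, 1]], [1, -1])
print(net.predict([1, 0, 1, 1]))  # within radius 1 of the first sample -> 1
```

Note the trade described in the abstract: no training time at all, at the cost of network size growing linearly with the number of samples.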
Using Neural Networks with Limited Data to Estimate Manufacturing Cost
Neural networks were used to estimate the cost of jet engine components, specifically shafts and cases. The neural network process was compared with results produced by the current conventional cost estimation software and linear regression methods. Due to the complex nature of the parts and the limited amount of information available, data expansion techniques such as doubling-data and data-cr...
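The abstract names "doubling-data" as an expansion technique without specifying it; one plausible reading, sketched below, is to append a noise-jittered copy of each sample so the training set doubles. The `noise_scale` parameter and Gaussian jitter are assumptions for illustration only.

```python
import numpy as np

def double_data(X, y, noise_scale=0.01, rng=None):
    """Append a jittered copy of each sample so the data set doubles.

    This is a generic data-augmentation sketch, not the paper's exact
    procedure: noise_scale and the Gaussian perturbation are assumed.
    """
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # Scale the jitter per feature so perturbations stay small
    jitter = rng.normal(0.0, noise_scale * X.std(axis=0), size=X.shape)
    X2 = np.vstack([X, X + jitter])  # original rows + perturbed copies
    y2 = np.concatenate([y, y])      # labels repeat unchanged
    return X2, y2

X2, y2 = double_data([[1.0, 10.0], [2.0, 20.0]], [100.0, 200.0], rng=0)
print(X2.shape, y2.shape)  # (4, 2) (4,)
```

With few parts available, such expansion gives the network more rows to fit while keeping the label structure intact.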
Stochastic Reservoir Simulation Using Neural Networks Trained on Outcrop Data
Extensive outcrop data, photographs of present-day depositions, or even simple drawings from expert geologists contain precious structural information about spatial continuity that is beyond the present tools of geostatistics, which are essentially limited to two-point statistics (histograms and covariances). A neural net can be trained to collect multiple-point statistics from various training images, t...
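To make "multiple-point statistics" concrete: instead of pairwise covariances, one tabulates the frequencies of whole local patterns in a training image. The toy routine below counts 2x2 binary patterns by scanning a grid; it stands in for the statistics the abstract says a neural net would learn, and the window size is an illustrative choice.

```python
from collections import Counter

import numpy as np

def pattern_histogram(image, size=2):
    """Collect toy multiple-point statistics: frequencies of size x size
    patterns scanned over a binary training image."""
    img = np.asarray(image)
    counts = Counter()
    for i in range(img.shape[0] - size + 1):
        for j in range(img.shape[1] - size + 1):
            # Flatten each window into a hashable tuple key
            counts[tuple(img[i:i + size, j:j + size].ravel())] += 1
    return counts

img = [[0, 0, 1],
       [0, 1, 1],
       [1, 1, 1]]
hist = pattern_histogram(img)
print(hist)  # the pattern (0, 1, 1, 1) occurs twice in this image
```

A two-point covariance cannot distinguish many images that such a pattern histogram separates, which is the motivation the abstract gives for moving beyond it.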
Noisy Neural Networks and Generalizations
In this paper we define a probabilistic computational model which generalizes many noisy neural network models, including the recent work of Maass and Sontag [5]. We identify weak ergodicity as the mechanism responsible for restricting the computational power of probabilistic models to definite languages, independent of the characteristics of the noise: whether it is discrete or analog, or i...
Journal
Journal title: Physical Review E
Year: 1993
ISSN: 1063-651X, 1095-3787
DOI: 10.1103/physreve.47.4465